Bottom-up Deep Learning using the Hebbian Principle
Authors
Abstract
The “fire together, wire together” Hebbian learning model is a central principle in neuroscience, but, surprisingly, it has found limited applicability in modern machine learning. In this paper, we show that neuro-plausible variants of competitive Hebbian learning provide a promising foundation for bottom-up deep learning. We propose an unsupervised learning algorithm termed Adaptive Hebbian Learning (AHL), which trains rapidly with minimal tuning while producing sparse, distributed neural codes. We obtain excellent classification results on standard image datasets with deep convolutional networks that use AHL as a building block and an SVM as the final layer. We also propose a Discriminative Hebbian Learning (DHL) algorithm that can exploit class labels by combining Hebbian and anti-Hebbian mechanisms in a manner inspired by dopamine-modulated learning. We show that combining AHL for lower layers with DHL for higher layers is a promising strategy for deep learning.
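The abstract does not spell out the update rule, so the sketch below is only a rough illustration of the kind of competitive Hebbian learning AHL builds on: a generic k-winner-take-all layer with an Oja-style decay term to keep the weights bounded. The function name `competitive_hebbian`, the choice of k, and the learning rate are assumptions made for the example; this is not the authors' AHL or DHL algorithm.

```python
# Illustrative sketch only: a generic k-winner-take-all competitive Hebbian layer.
# Not the paper's AHL/DHL; hyperparameters and names are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def competitive_hebbian(X, n_units=16, k=2, lr=0.05, epochs=5):
    """Learn a weight matrix with a k-winner-take-all Hebbian rule.

    X        : (n_samples, n_features) inputs, assumed roughly normalized.
    n_units  : number of output neurons.
    k        : number of winners updated per sample (yields sparse activity).
    """
    n_features = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_units, n_features))
    for _ in range(epochs):
        for x in X:
            y = W @ x                        # pre-activations of all units
            winners = np.argsort(y)[-k:]     # only the top-k neurons adapt
            for j in winners:
                # Hebbian update with an Oja-style decay term so the weights
                # stay bounded: "fire together, wire together".
                W[j] += lr * y[j] * (x - y[j] * W[j])
    return W

# Toy usage: random vectors standing in for image patches.
X = rng.normal(size=(200, 32))
W = competitive_hebbian(X)
codes = np.maximum(X @ W.T, 0.0)             # sparse, distributed code per input
print(codes.shape)
```

Restricting the update to the few winning neurons is what makes the learned code sparse and distributed; a supervised variant in the spirit of DHL would additionally apply an anti-Hebbian (sign-flipped) update when a unit fires for the wrong class, which is omitted here.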
Similar Resources
Learning Sparse, Distributed Representations using the Hebbian Principle
The “fire together, wire together” Hebbian model is a central principle for learning in neuroscience, but surprisingly, it has found limited applicability in modern machine learning. In this paper, we take a first step towards bridging this gap, by developing flavors of competitive Hebbian learning which produce sparse, distributed neural codes using online adaptation with minimal tuning. We pr...
Nonlinear Hebbian Learning as a Unifying Principle in Receptive Field Formation
The development of sensory receptive fields has been modeled in the past by a variety of models including normative models such as sparse coding or independent component analysis and bottom-up models such as spike-timing dependent plasticity or the Bienenstock-Cooper-Munro model of synaptic plasticity. Here we show that the above variety of approaches can all be unified into a single common pri...
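The snippet above is truncated, but the "single common principle" it refers to is usually written as a nonlinear Hebbian update of the form below, where $f$ is the neuron's nonlinearity; the decay term $\lambda(\mathbf{w})\,\mathbf{w}$ is an assumption included here only to indicate how the weights are kept bounded:

$$\Delta \mathbf{w} \;\propto\; \mathbf{x}\, f(\mathbf{w}^{\top}\mathbf{x}) \;-\; \lambda(\mathbf{w})\,\mathbf{w}$$

Different choices of $f$ then recover sparse coding, ICA-like, or BCM-like receptive-field development as special cases.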
Real-time Hebbian Learning from Autoencoder Features for Control Tasks
Neural plasticity, and in particular Hebbian learning, plays an important role in many research areas related to artificial life. By allowing artificial neural networks (ANNs) to adjust their weights in real time, Hebbian ANNs can adapt over their lifetime. However, even as researchers improve and extend Hebbian learning, a fundamental limitation of such systems is that they learn correlations betw...
Nonlinear Hebbian learning
The development of sensory receptive fields has been modeled in the past by a variety of models including normative models such as sparse coding or independent component analysis and bottom-up models such as spike-timing dependent plasticity or the Bienenstock-Cooper-Munro model of synaptic plasticity. Here we show that the above variety of approaches can all be unified into a single common pri...
Dynamic Model of Visual Recognition Predicts Neural Response Properties in the Visual Cortex
The responses of visual cortical neurons during fixation tasks can be significantly modulated by stimuli from beyond the classical receptive field. Modulatory effects in neural responses have also been recently reported in a task where a monkey freely views a natural scene. In this article, we describe a hierarchical network model of visual recognition that explains these experimental observati...